Optoelectronic Systems Trained With Backpropagation Through Time
Authors
Abstract
Similar Resources
Improving Stability of Recurrent Neural Networks Trained with Backpropagation Through Time
Backpropagation through time (BPTT) is a natural generalization of backpropagation to recurrent real-time neural networks. The idea is simply to treat the network at each timestep as a separate layer in a multilayer network. Stability and information retention have long been limiting factors in deep multilayer neural networks. As an extension of deep nets to an unbounded number of layers, BPTT ...
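The unrolling idea described above can be sketched on a toy recurrence. This is a minimal illustration, not code from the paper: a scalar linear "RNN" h_t = w * h_{t-1} + x_t, where the gradient of the final state with respect to w is accumulated backwards through every timestep, exactly as if each timestep were one layer of a deep network.

```python
# Minimal BPTT sketch on a scalar linear recurrence: h_t = w * h_{t-1} + x_t.
# Unrolling treats each timestep as one layer, so the chain rule runs
# backwards over all "layers" to accumulate dL/dw.

def forward(w, xs, h0=0.0):
    hs = [h0]
    for x in xs:
        hs.append(w * hs[-1] + x)
    return hs  # hidden states h_0 .. h_T

def bptt_grad(w, xs, h0=0.0):
    hs = forward(w, xs, h0)
    grad, delta = 0.0, 1.0  # dL/dh_T = 1 for the loss L = h_T
    for t in range(len(xs), 0, -1):
        grad += delta * hs[t - 1]  # local term: dh_t/dw = h_{t-1}
        delta *= w                 # propagate dh_t/dh_{t-1} = w backwards
    return grad

# Sanity check against a finite-difference estimate.
xs, w, eps = [1.0, 0.5, -0.3], 0.9, 1e-6
numeric = (forward(w + eps, xs)[-1] - forward(w - eps, xs)[-1]) / (2 * eps)
print(abs(bptt_grad(w, xs) - numeric) < 1e-6)
```

The repeated multiplication by w in the backward pass is also where the stability issue mentioned in the abstract shows up: for |w| > 1 the factor delta grows without bound as the number of unrolled timesteps increases, and for |w| < 1 it vanishes.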
BackPropagation Through Time
This report provides a detailed description of, and the necessary derivations for, the BackPropagation Through Time (BPTT) algorithm. BPTT is often used to train recurrent neural networks (RNNs). In contrast to feed-forward neural networks, the RNN is characterized by its ability to encode longer-range past information, making it well suited to sequential models. BPTT extends the ordinary BP algorithm to suit t...
Unbiasing Truncated Backpropagation Through Time
Truncated Backpropagation Through Time (truncated BPTT, Jaeger (2005)) is a widespread method for learning recurrent computational graphs. Truncated BPTT keeps the computational benefits of Backpropagation Through Time (BPTT Werbos (1990)) while relieving the need for a complete backtrack through the whole data sequence at every step. However, truncation favors short-term dependencies: the grad...
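The truncation bias this abstract refers to can be made concrete on the same toy scalar recurrence h_t = w * h_{t-1} + x_t (an illustrative sketch, not the unbiasing method proposed in the paper): cutting the backward pass after k steps simply drops the gradient terms carrying higher powers of w, i.e. the long-range dependencies.

```python
# Truncated BPTT sketch on h_t = w * h_{t-1} + x_t: the backward pass
# stops after k steps, so contributions from timesteps older than k
# (terms with higher powers of w) are dropped, biasing the gradient
# toward short-term dependencies.

def truncated_bptt_grad(w, xs, k, h0=0.0):
    hs = [h0]
    for x in xs:
        hs.append(w * hs[-1] + x)
    grad, delta = 0.0, 1.0  # dL/dh_T = 1 for the loss L = h_T
    for i, t in enumerate(range(len(xs), 0, -1)):
        if i == k:
            break  # truncate: ignore dependencies older than k steps
        grad += delta * hs[t - 1]
        delta *= w
    return grad

xs = [1.0, 0.5, -0.3]
full = truncated_bptt_grad(0.9, xs, k=len(xs))  # k = T recovers full BPTT
short = truncated_bptt_grad(0.9, xs, k=1)       # only the most recent step
print(full, short)
```

Setting k equal to the sequence length recovers the exact BPTT gradient; any smaller k gives a cheaper but biased estimate, which is the trade-off the paper sets out to correct.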
Memory-Efficient Backpropagation Through Time
We propose a novel approach to reduce memory consumption of the backpropagation through time (BPTT) algorithm when training recurrent neural networks (RNNs). Our approach uses dynamic programming to balance a trade-off between caching of intermediate results and recomputation. The algorithm is capable of tightly fitting within almost any user-set memory budget while finding an optimal execution...
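The caching-versus-recomputation trade-off described above can be sketched with a simple checkpointing scheme on the same scalar recurrence h_t = w * h_{t-1} + x_t. Note this is an illustrative sketch only: the paper chooses checkpoint positions by dynamic programming to fit a given memory budget, whereas a fixed stride is hard-coded here for brevity.

```python
# Checkpointed BPTT sketch: store only every `stride`-th hidden state of
# h_t = w * h_{t-1} + x_t during the forward pass (O(T/stride) memory),
# then recompute the states inside each segment on demand during the
# backward pass (extra compute traded for less memory).

def checkpointed_bptt_grad(w, xs, stride, h0=0.0):
    # Forward pass: keep only checkpoints.
    ckpts, h = {0: h0}, h0
    for t, x in enumerate(xs, start=1):
        h = w * h + x
        if t % stride == 0:
            ckpts[t] = h

    # Backward pass: dL/dh_T = 1 for the loss L = h_T.
    grad, delta = 0.0, 1.0
    for t in range(len(xs), 0, -1):
        # Recompute h_{t-1} from the nearest earlier checkpoint.
        base = (t - 1) - ((t - 1) % stride)
        h_prev = ckpts[base]
        for s in range(base + 1, t):
            h_prev = w * h_prev + xs[s - 1]
        grad += delta * h_prev
        delta *= w
    return grad

xs = [1.0, 0.5, -0.3, 0.2]
print(checkpointed_bptt_grad(0.9, xs, stride=2))
```

With stride=1 every state is cached (plain BPTT); with stride equal to the sequence length only the endpoints are kept and each segment is recomputed from scratch. Both extremes yield the same gradient, differing only in the memory/compute balance.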
Journal
Journal title: IEEE Transactions on Neural Networks and Learning Systems
Year: 2015
ISSN: 2162-237X,2162-2388
DOI: 10.1109/tnnls.2014.2344002